An interior-point gradient method for large-scale totally nonnegative least squares problems
Authors
Abstract
We study an interior-point gradient method for solving a class of so-called totally nonnegative least squares problems. At each iteration, the method decreases the residual norm along a diagonally scaled negative gradient direction with a special scaling. We establish the global convergence of the method, and present some numerical examples to compare the proposed method with a few similar methods including the affine scaling method.
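To make the iteration concrete, here is a minimal Python sketch of a diagonally scaled gradient scheme for min ||Ax - b||^2 subject to x >= 0 with elementwise nonnegative A and b. The abstract does not reproduce the paper's special scaling, so the sketch assumes the common componentwise choice D = diag(x) / (A^T A x) together with a fraction-to-the-boundary safeguard; the function name scaled_gradient_nnls and the step parameters are illustrative only, not taken from the paper.

# Sketch of a diagonally scaled (interior-point) gradient iteration for
# min ||Ax - b||^2 s.t. x >= 0, assuming elementwise nonnegative A and b.
# The scaling D = diag(x) / (A^T A x) used below is a standard choice for this
# kind of method, not necessarily the paper's "special scaling".
import numpy as np

def scaled_gradient_nnls(A, b, x0, max_iter=500, beta=0.99, tol=1e-10):
    x = np.asarray(x0, dtype=float).copy()      # strictly positive starting point
    AtA = A.T @ A
    Atb = A.T @ b
    for _ in range(max_iter):
        g = AtA @ x - Atb                       # gradient of 0.5*||Ax - b||^2
        q = AtA @ x                             # positive when A, b >= 0 and x > 0
        d = -(x / q) * g                        # diagonally scaled negative gradient
        if np.linalg.norm(d) <= tol:
            break
        Ad = A @ d
        denom = Ad @ Ad
        if denom <= 0:
            break
        alpha_star = -(g @ d) / denom           # exact minimizer of the residual along d
        # keep the iterate strictly positive: step at most a fraction beta of the
        # distance to the boundary along the components where d is negative
        neg = d < 0
        alpha_max = beta * np.min(-x[neg] / d[neg]) if np.any(neg) else alpha_star
        x = x + min(alpha_star, alpha_max) * d
    return x

# tiny usage example with random nonnegative data
rng = np.random.default_rng(0)
A = rng.random((50, 10))
b = rng.random(50)
x = scaled_gradient_nnls(A, b, x0=np.ones(10))
print(np.linalg.norm(A @ x - b))

Because d is the negative gradient rescaled by a positive diagonal, it is always a descent direction for the residual norm, and the safeguard keeps every iterate in the interior of the nonnegative orthant.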
Similar articles
An Efficient Method for Large-Scale l1-Regularized Convex Loss Minimization
Convex loss minimization with l1 regularization has been proposed as a promising method for feature selection in classification (e.g., l1-regularized logistic regression) and regression (e.g., l1-regularized least squares). In this paper we describe an efficient interior-point method for solving large-scale l1-regularized convex loss minimization problems that uses a preconditioned conjugate gradient ...
An Interior-Point Method for Large-Scale l1-Regularized Least Squares
Recently, a lot of attention has been paid to l1-regularization-based methods for sparse signal reconstruction (e.g., basis pursuit denoising and compressed sensing) and feature selection (e.g., the Lasso algorithm) in signal processing, statistics, and related fields. These problems can be cast as l1-regularized least squares programs (LSPs), which can be reformulated as convex quadratic programs ...
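For reference, the reformulation alluded to above is the standard one in which each |x_i| is bounded by an auxiliary variable u_i; the notation (A, y, lambda) below is generic rather than quoted from the truncated abstract:

% generic notation; not quoted from the paper
\min_{x}\; \|Ax - y\|_2^2 + \lambda \|x\|_1
\qquad\Longleftrightarrow\qquad
\min_{x,\,u}\; \|Ax - y\|_2^2 + \lambda \sum_{i=1}^{n} u_i
\quad \text{s.t.}\quad -u_i \le x_i \le u_i,\; i = 1,\dots,n,

which is a convex quadratic program with linear inequality constraints.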
Gradient methods and conic least-squares problems
This paper presents two reformulations of the dual of the constrained least squares problem over convex cones. In addition, it extends Nesterov’s excessive gap method 1 [21] to more general problems. The conic least squares problem is then solved by applying the resulting modified method, or Nesterov’s smooth method [22], or Nesterov’s excessive gap method 2 [21], to the dual reformulations. Numerical ...
An interior point Newton-like method for non-negative least-squares problems with degenerate solution
An interior-point approach for medium- and large-scale nonnegative linear least-squares problems is proposed. Global convergence and local quadratic convergence are shown even when a degenerate solution is approached. Viable approaches for implementation are discussed and numerical results are provided.
A Comparison of Block Pivoting and Interior-point Algorithms for Linear Least Squares Problems with Nonnegative Variables
In this paper we discuss the use of block principal pivoting and predictor-corrector methods for the solution of large-scale linear least squares problems with nonnegative variables (NVLSQ). We also describe two implementations of these algorithms that are based on the normal equations and corrected seminormal equations (CSNE) approaches. We show that the method of normal equations should be employed ...
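For reference, the formulations named in this abstract can be written out as follows; the notation is generic rather than quoted from the paper, and the characterization of CSNE as the seminormal equations followed by one step of iterative refinement is the standard textbook description, assumed here rather than taken from the abstract:

% generic notation; not quoted from the paper
\begin{aligned}
\text{(NVLSQ)} \quad & \min_{x \ge 0}\; \|Ax - b\|_2^2, \\
\text{(normal equations)} \quad & A^{\mathsf T} A\, x = A^{\mathsf T} b, \\
\text{(seminormal equations)} \quad & R^{\mathsf T} R\, x = A^{\mathsf T} b, \qquad A = QR .
\end{aligned}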